Kernel Estimator and Bandwidth Selection for Density and its Derivatives

Author

  • Arsalane Chouaib Guidoum

Abstract

In statistics, univariate kernel density estimation (KDE) is a non-parametric way to estimate the probability density function f(x) of a random variable X. It is a fundamental data-smoothing problem in which inferences about the population are made from a finite data sample. These techniques are widely used in inference procedures in signal processing, data mining and econometrics; see, e.g., Silverman [1986], Wand and Jones [1995], Jeffrey [1996], Wolfgang et al. [2004], Alexandre [2009]. Kernel estimators are standard in many books, with applications to computer vision and discussions of computational complexity and implementation in S; see Wolfgang [1991], Scott [1992], Bowman and Azzalini [1997], Venables and Ripley [2002] for an overview. Estimation of density derivatives also arises in various other applications, such as the estimation of modes and inflexion points of densities; a good list of applications that require the estimation of density derivatives can be found in Singh [1977]. A number of R packages already perform kernel density estimation (density in base R); see, for example, KernSmooth [Wand and Ripley, 2013], sm [Bowman and Azzalini, 2013], np [Tristen and Jeffrey, 2008] and feature [Duong and Matt, 2013]. Functions for kernel density derivative estimation (KDDE) also exist, e.g., kdde in the ks package [Duong, 2007]. In this vignette we introduce a new R package, kedd [Guidoum, 2015], for use with the statistical programming environment R [R Development Core Team, 2015], which implements smoothing techniques and computes bandwidth selectors for the rth derivative of a probability density f(x) for univariate data, using several kernel functions.
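As a brief illustration of the workflow the abstract describes, the sketch below selects a bandwidth for the first derivative by unbiased cross-validation and then estimates f'(x). It assumes the dkde() and h.ucv() interfaces with a deriv.order argument and an $h component, as described in the package documentation; treat it as a sketch rather than a definitive usage pattern.

```r
## Minimal sketch, assuming kedd's dkde() and h.ucv() interfaces
## (deriv.order argument, $h component) as described in its documentation.
library(kedd)

set.seed(1)
x <- rnorm(200)                       # sample from N(0, 1)

## bandwidth for f' selected by unbiased cross-validation
h1 <- h.ucv(x, deriv.order = 1, kernel = "gaussian")

## kernel estimate of the first derivative f'(x)
fit <- dkde(x, deriv.order = 1, h = h1$h, kernel = "gaussian")
plot(fit, main = "Kernel estimate of f'(x)")
```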


Similar references

Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth varies with the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
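For intuition, here is a minimal sketch of the complete-data k-nearest-neighbour kernel estimator, in which the bandwidth at each evaluation point is the distance to the k-th nearest sample point; knn_kde() is a hypothetical helper, and the left-truncated setting the paper studies is not handled.

```r
## Sketch of a k-NN kernel density estimator: the local bandwidth at x0
## is the distance to the k-th nearest observation, so smoothing adapts
## to how dense the data are around x0.
knn_kde <- function(x.eval, x, k = 20) {
  sapply(x.eval, function(x0) {
    h.k <- sort(abs(x - x0))[k]          # local bandwidth: k-th NN distance
    mean(dnorm((x0 - x) / h.k)) / h.k    # Gaussian kernel average
  })
}

set.seed(1)
x    <- rnorm(500)
grid <- seq(-4, 4, length.out = 201)
plot(grid, knn_kde(grid, x), type = "l",
     xlab = "x", ylab = "density", main = "k-NN kernel density estimate")
```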


A Two-Stage Plug-In Bandwidth Selection and Its Implementation in Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation

The performance of a kernel HAC estimator depends on the accuracy of the estimation of the normalized curvature, an unknown quantity in the optimal bandwidth that is expressed in terms of the spectral density and its derivative. This paper proposes to estimate it with a general class of kernels. The AMSE of the kernel estimator and the AMSE-optimal bandwidth are derived. It is shown that the optimal bandwidth...
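For orientation, the AMSE-optimal bandwidth in this literature typically takes the Andrews [1991]-type form below, where q is the characteristic exponent of the kernel k, k_q measures its curvature at the origin, and α(q) is the normalized curvature the abstract refers to; this is the generic form from the HAC literature, not the paper's specific result.

\[
S_T^{\ast} = \left( \frac{q\, k_q^{2}\, \alpha(q)}{\int k^{2}(x)\, dx}\; T \right)^{1/(2q+1)},
\qquad
k_q = \lim_{x \to 0} \frac{1 - k(x)}{|x|^{q}}.
\]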


Semiparametric Localized Bandwidth Selection in Kernel Density Estimation

Since conventional cross-validation bandwidth selection methods do not work when the data are serially dependent, alternative bandwidth selection methods are needed. In recent years, Bayesian-based global bandwidth selection methods have been proposed. Our experience shows that the use of a global bandwidth is, however, less suitable than using a localized bandwidth in ke...
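The contrast between a global and a localized bandwidth can be illustrated with a simple sample-point adaptive estimator in the spirit of Abramson's square-root law; this is not the paper's semiparametric Bayesian selector, only a sketch of why location-dependent smoothing helps. adaptive_kde() is a hypothetical helper built on base R.

```r
## Sketch of a localized-bandwidth KDE (Abramson square-root law):
## each observation X_i gets its own bandwidth h_i = h0 * lambda_i,
## with lambda_i inversely proportional to sqrt of a pilot density.
adaptive_kde <- function(x.eval, x, h0) {
  pilot   <- density(x, bw = h0)                      # global-bandwidth pilot
  f.pilot <- approx(pilot$x, pilot$y, xout = x)$y     # pilot density at X_i
  lambda  <- sqrt(exp(mean(log(f.pilot))) / f.pilot)  # local factors
  h.i     <- h0 * lambda                              # per-point bandwidths
  sapply(x.eval, function(x0) mean(dnorm(x0, mean = x, sd = h.i)))
}

set.seed(1)
x    <- c(rnorm(150), rnorm(50, mean = 4, sd = 0.5))  # unequal smoothness
grid <- seq(-4, 7, length.out = 301)
plot(grid, adaptive_kde(grid, x, h0 = bw.nrd0(x)), type = "l",
     xlab = "x", ylab = "density", main = "Sample-point adaptive KDE")
```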


The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel

One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation of the usual kernel (GEUK), a bias-reduction kernel (BRK) and geometric extrapolation of the bias-reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...
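To make the geometric-extrapolation idea concrete, the sketch below combines ordinary kernel estimates at bandwidths h and 2h multiplicatively, which cancels the leading O(h^2) bias term (the classical Terrell-Scott construction); the paper's GEUK and GEBRK estimators refine this idea, so read the code as background rather than their exact method.

```r
## Geometric extrapolation for bias reduction: since
## log f.h ~ log f + h^2 * beta(x), the combination
##   f.ge(x) = f.h(x)^(4/3) * f.2h(x)^(-1/3)
## removes the O(h^2) term ((4/3) - (1/3)*4 = 0).
set.seed(1)
x    <- rnorm(300)
grid <- seq(-4, 4, length.out = 201)

h    <- bw.nrd0(x)
f.h  <- density(x, bw = h,     from = -4, to = 4, n = 201)$y
f.2h <- density(x, bw = 2 * h, from = -4, to = 4, n = 201)$y

f.ge <- f.h^(4/3) * f.2h^(-1/3)          # geometrically extrapolated estimate

plot(grid, f.ge, type = "l", xlab = "x", ylab = "density",
     main = "Geometrically extrapolated kernel estimate")
lines(grid, f.h, lty = 2)                # ordinary estimate for comparison
```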


Estimating a density under pointwise constraints on the derivatives

Suppose we want to estimate a density at a point where we know the values of its first or higher order derivatives. In this case a given kernel estimator of the density can be modified by adding appropriately weighted kernel estimators of these derivatives. We give conditions under which the modified estimators are asymptotically normal. We also determine the optimal weights. When the highest d...
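One natural way to write the modification the abstract describes is shown below; the weights w_j are placeholders for the optimal weights the paper derives, so read this as an illustrative form rather than the authors' exact estimator.

\[
\tilde f_h(x_0) \;=\; \hat f_h(x_0) \;+\; \sum_{j=1}^{m} w_j \left[ f^{(j)}(x_0) - \hat f^{(j)}_h(x_0) \right],
\]

where the f^{(j)}(x_0) are the known derivative values at the point x_0 and the \hat f^{(j)}_h are their kernel estimators; correlation between the density and derivative estimators is what allows a variance (or bias) improvement.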




Publication year: 2015